Maximum Entropy Estimates for Risk-Neutral Probability Measures with Non-Strictly-Convex Data

Authors

  • Christopher J. Bose
  • Rua Murray
Abstract

This article investigates the use of the Principle of Maximum Entropy for approximating the risk-neutral probability density of the price of a financial asset, as inferred from the market prices of associated options. The usual strict convexity assumption on the market-price-to-strike-price function is relaxed, provided one is willing to accept a partially supported risk-neutral density. This provides a natural and useful extension of the standard theory. We present a rigorous analysis of the related optimization problem via convex duality and constraint qualification on both bounded and unbounded price domains. The relevance of this work for applications lies in explaining precisely the consequences of any gap between convexity and strict convexity in the price function. The computational feasibility of the method and the analytic consequences arising from non-strictly-convex price functions are illustrated with a numerical example.
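The core computation the abstract refers to can be sketched as a finite-dimensional maximum-entropy problem: observed option prices act as linear moment constraints on the risk-neutral density, and solving the convex dual yields a density of exponential-family form. The snippet below is a minimal, discretized illustration of that idea only, not the authors' algorithm or analysis; the price grid, forward price, strikes, and call prices are hypothetical, interest rates are ignored, and the continuous-domain duality and constraint-qualification issues studied in the paper are not addressed.

```python
# Minimal sketch (illustration only, not the authors' method): recover a
# discretized risk-neutral probability vector q on a bounded price grid by
# maximum entropy, using option prices as linear moment constraints.
# All market data below (grid, forward, strikes, call prices) are hypothetical.

import numpy as np
from scipy.optimize import minimize

S = np.linspace(0.0, 200.0, 2001)                       # bounded price grid
strikes = np.array([80.0, 90.0, 100.0, 110.0, 120.0])   # hypothetical strikes
calls = np.array([21.5, 13.8, 8.0, 4.2, 2.0])           # hypothetical (undiscounted) call prices
forward = 100.0                                          # hypothetical forward price

# Constraint functions g_j(S): the identity (mean = forward) and one call payoff per strike.
G = np.vstack([S] + [np.maximum(S - K, 0.0) for K in strikes])  # shape (m, n)
b = np.concatenate([[forward], calls])                          # target moments E_q[g] = b

def dual(lam):
    """Convex dual of the max-entropy problem: log-partition minus lam . b."""
    u = G.T @ lam
    c = u.max()
    return c + np.log(np.exp(u - c).sum()) - lam @ b

def dual_grad(lam):
    """Gradient E_q[g] - b, where q is the current exponential-family candidate."""
    u = G.T @ lam
    w = np.exp(u - u.max())
    q = w / w.sum()
    return G @ q - b

res = minimize(dual, x0=np.zeros(len(b)), jac=dual_grad, method="BFGS")

u = G.T @ res.x
q = np.exp(u - u.max())
q /= q.sum()                      # maximum-entropy risk-neutral weights on the grid

print("dual converged:", res.success)
print("max constraint residual:", np.abs(G @ q - b).max())
```

In this toy setting, strictly convex call-price data keep the recovered weights positive across the grid; when the price data are merely convex, i.e. affine over some strike interval, the weights can vanish on part of the grid. That "partially supported" outcome is exactly the situation the paper analyses rigorously via convex duality and constraint qualification.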

Similar Articles

A Dissipation of Relative Entropy by Diffusion Flows

Given a probability measure, we consider the diffusion flows of probability measures associated with the Fokker–Planck partial differential equation (PDE). Our flows of probability measures are defined as solutions of the Fokker–Planck equation for the same strictly convex potential, which means that the flows share the same equilibrium. Then, we shall investigate the time de...

Optimal measures and Markov transition kernels

We study optimal solutions to an abstract optimization problem for measures, which is a generalization of classical variational problems in information theory and statistical physics. In the classical problems, information and relative entropy are defined using the Kullback–Leibler divergence, and for this reason optimal measures belong to a one-parameter exponential family. Measures within such...

Construction of polygonal interpolants: a maximum entropy approach

In this paper, we establish a link between maximizing (information-theoretic) entropy and the construction of polygonal interpolants. The determination of shape functions on n-gons (n > 3) leads to an under-determined system of linear equations with non-unique solutions. The barycentric co-ordinates φ_i, which form a partition of unity, are associated with discrete probability measures, and the linear reproduci...

Determination of Maximum Bayesian Entropy Probability Distribution

In this paper, we consider methods for determining maximum entropy multivariate distributions with a given prior, under the constraints that either the marginal distributions, or the marginals together with the covariance matrix, are prescribed. Next, numerical solutions are considered for cases where closed-form solutions are unavailable. Finally, these methods are illustrated via some numerical examples.

Journal:
  • J. Optimization Theory and Applications

Volume 161, Issue –

Pages –

Publication date: 2014